143 research outputs found

    A Systematic Approach to Predict Performance of Human-Automation Systems

    ©2007 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other users, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works for resale or redistribution to servers or lists, or reuse of any copyrighted components of this work in other works.
    DOI: 10.1109/TSMCC.2007.897505
    This paper discusses an approach for predicting the system performance that results from humans and robots performing repetitive tasks in a collaborative manner. The methodology uses a systematic approach that incorporates the various effects of workload on human performance and estimates the resulting performance attributes for control modes ranging between teleoperated and autonomous control of robotic systems. Performance is determined by incorporating the capabilities of the human and robotic agents, based on the accomplishment of functional operations and the effect of cognitive stress due to continuous operation by the human agent. The paper provides an overview of the prediction system and discusses its implementation on a simulated rendezvous/docking task.

    Effect of a Home-Based Virtual Reality Intervention for Children with Cerebral Palsy Using Super Pop VR Evaluation Metrics: A Feasibility Study

    Objective. The purpose of this pilot study was to determine whether Super Pop VR, a low-cost virtual reality (VR) system, was a feasible system for documenting improvement in children with cerebral palsy (CP) and whether a home-based VR intervention was effective. Methods. Three children with CP participated in this study and received an 8-week VR intervention (30 minutes × 5 sessions/week) using the commercial EyeToy Play VR system. Reaching kinematics measured by Super Pop VR and two fine motor tools (Bruininks-Oseretsky Test of Motor Proficiency, second edition, BOT-2, and Pediatric Motor Activity Log, PMAL) were tested before, at the midpoint of, and after the intervention. Results. All children successfully completed the evaluations using the Super Pop VR system at home, and 85% of the collected reaches could be used to compute reaching kinematics, a rate comparable to those reported in the literature for expensive motion analysis systems. Only the child with hemiplegic CP and more impaired arm function improved in reaching kinematics and functional use of the affected hand after the intervention. Conclusion. Super Pop VR proved to be a feasible evaluation tool for children with CP.

    Musical Robots For Children With ASD Using A Client-Server Architecture

    Presented at the 22nd International Conference on Auditory Display (ICAD-2016).
    People with Autistic Spectrum Disorders (ASD) are known to have difficulty recognizing and expressing emotions, which affects their social integration. Leveraging recent advances in interactive robots and music therapy, and integrating the two, we have designed musical robots that can facilitate social and emotional interactions of children with ASD. The robots communicate with children with ASD while detecting their emotional states and physical activities, and then generate real-time sonification based on the interaction data. Given that we envision the use of multiple robots with children, we have adopted a client-server architecture: each robot and sensing device acts as a terminal, while the sonification server processes all the data and generates harmonized sonification. After describing our goals for the use of sonification, we detail the system architecture and ongoing research scenarios. We believe that this paper offers a new perspective on sonification applications for assistive technologies.
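    As a rough illustration of the client-server split described in this abstract (not the authors' implementation), the Python sketch below has terminal processes stream interaction data to a central sonification server as JSON lines over TCP. The message fields, port number, and print-based "sonification" placeholder are all assumptions made for the example.

```python
# Illustrative sketch (not the paper's code): robot/sensor terminals stream
# interaction data to a central sonification server over TCP as JSON lines.
import json
import socket
import socketserver
import threading
import time

HOST, PORT = "127.0.0.1", 9999  # hypothetical address of the sonification server


class SonificationHandler(socketserver.StreamRequestHandler):
    def handle(self):
        # Each terminal (robot or sensing device) sends one JSON object per line.
        for line in self.rfile:
            msg = json.loads(line)
            # Placeholder for mapping emotional state / activity to sound parameters.
            print(f"sonify terminal={msg['terminal']} "
                  f"emotion={msg['emotion']} activity={msg['activity']}")


def run_server():
    with socketserver.ThreadingTCPServer((HOST, PORT), SonificationHandler) as server:
        server.serve_forever()


def send_reading(terminal_id, emotion, activity):
    # A terminal reports its latest interaction data to the server.
    with socket.create_connection((HOST, PORT)) as sock:
        payload = {"terminal": terminal_id, "emotion": emotion, "activity": activity}
        sock.sendall((json.dumps(payload) + "\n").encode())


if __name__ == "__main__":
    threading.Thread(target=run_server, daemon=True).start()
    time.sleep(0.2)                       # give the demo server a moment to bind
    send_reading("robot-1", "happy", 0.7)
    time.sleep(0.2)                       # let the handler print before exiting
```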

    Behavior-Based Robot Navigation on Challenging Terrain: A Fuzzy Logic Approach

    ©2002 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other users, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works for resale or redistribution to servers or lists, or reuse of any copyrighted components of this work in other works.
    DOI: 10.1109/TRA.2002.1019461
    This paper presents a new strategy for behavior-based navigation of field mobile robots on challenging terrain, using a fuzzy logic approach and a novel measure of terrain traversability. A key feature of the proposed approach is real-time assessment of terrain characteristics and incorporation of this information into the robot navigation strategy. Three terrain characteristics that strongly affect traversability, namely roughness, slope, and discontinuity, are extracted from video images obtained by onboard cameras. These data are used to infer, in real time, the terrain Fuzzy Rule-Based Traversability Index, which succinctly quantifies the ease of traversal of the regional terrain by the mobile robot. A new traverse-terrain behavior is introduced that uses the regional traversability index to guide the robot to the safest and most traversable terrain region. The regional traverse-terrain behavior is complemented by two other behaviors, local avoid-obstacle and global seek-goal. The recommendations of these three behaviors are integrated through adjustable weighting factors to generate the final motion command for the robot; the weighting factors are adjusted automatically based on the situational context of the robot. The terrain assessment and robot navigation algorithms are implemented on a Pioneer commercial robot, and field-test studies are conducted. These studies demonstrate that the robot possesses intelligent decision-making capabilities that are brought to bear in negotiating hazardous terrain conditions during robot motion.
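    To make the two mechanisms in this abstract concrete, the following toy Python sketch shows (a) a fuzzy rule-based traversability index inferred from normalized roughness, slope, and discontinuity values, and (b) a weighted blend of per-behavior steering recommendations. The membership functions, rule base, and weighting scheme are invented for illustration and are not the paper's actual rules or gains.

```python
# Illustrative sketch only (not the paper's rule base): a toy fuzzy
# traversability index from roughness, slope, and discontinuity, followed by
# a weighted blend of per-behavior steering recommendations.

def low(x):
    """Membership of x (normalized to [0, 1]) in a 'low' fuzzy set."""
    return max(0.0, 1.0 - 2.0 * x)

def high(x):
    """Membership of x (normalized to [0, 1]) in a 'high' fuzzy set."""
    return max(0.0, 2.0 * x - 1.0)

def traversability_index(roughness, slope, discontinuity):
    """Return a crisp index in [0, 1]; higher means easier to traverse."""
    # Two toy rules, defuzzified by a weighted average of singleton outputs:
    #   IF roughness low AND slope low AND discontinuity low THEN index HIGH (1.0)
    #   IF roughness high OR slope high OR discontinuity high THEN index LOW (0.0)
    fire_high = min(low(roughness), low(slope), low(discontinuity))
    fire_low = max(high(roughness), high(slope), high(discontinuity))
    total = fire_high + fire_low
    return 0.5 if total == 0.0 else fire_high / total

def blend_steering(recommendations, weights):
    """Combine per-behavior steering angles using situational weights."""
    total_w = sum(weights[b] for b in recommendations)
    return sum(weights[b] * recommendations[b] for b in recommendations) / total_w

if __name__ == "__main__":
    tau = traversability_index(roughness=0.2, slope=0.4, discontinuity=0.1)
    # Hypothetical steering commands (degrees) from the three behaviors; the
    # traverse-terrain behavior is weighted more heavily as terrain gets harder.
    recs = {"seek_goal": 0.0, "avoid_obstacle": 15.0, "traverse_terrain": -20.0}
    w = {"seek_goal": 1.0, "avoid_obstacle": 1.0, "traverse_terrain": 1.0 + (1.0 - tau)}
    print(f"traversability={tau:.2f}  steering={blend_steering(recs, w):.1f} deg")
```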

    Sensing and perception challenges of planetary surface robotics

    ©2002 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other users, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works for resale or redistribution to servers or lists, or reuse of any copyrighted components of this work in other works.
    Presented at IEEE Sensors 2002, Orlando, FL, June 2002.
    DOI: 10.1109/ICSENS.2002.1037379
    This expository paper describes sensing and perception issues facing the space robotics community concerned with deploying autonomous rovers on other planetary surfaces. Challenging sensing problems associated with rover surface navigation and manipulation functions are discussed, for which practical solutions from sensor developers would vastly improve rover capabilities. Some practical concerns that impact sensor selection based on mass, power, and operability constraints are also discussed. The intent is to present challenges to facilitate alignment of new sensing solutions with key sensing requirements of planetary surface robotics.

    Prospects of Implementing a Vhand Glove as a Robotic Controller

    The Tower is an official publication of the Georgia Tech Office of Student Media and is sponsored by the Undergraduate Research Opportunities Program and the Georgia Tech Library. This article appeared in Volume 3, pages 43-51.
    There are numerous approaches and systems for implementing a robot controller. This project investigates the potential of using the VHand Motion Capturing Glove, developed by DGTech, as a means of controlling a programmable robot. A GUI-based application was used to identify and reflect the extended or closed state of each finger on the gloved hand. A calibration algorithm was added to the existing application source code to increase the precision with which extended and closed finger positions are recognized and to improve the efficiency of hand-signal interpretation. Furthermore, the scan rate and sample size of the bit signal coming from the glove were adjusted to improve the accuracy of recognizing dynamic hand signals, i.e., defined signals consisting of sequential finger positions. An attempt was made to link the VHand glove to a Scribbler robot by writing the recognized hand signals to a text file that was simultaneously read by a Python-based application, which then transmitted commands to the Scribbler robot via a Bluetooth serial link. However, there was difficulty achieving real-time communication between the VHand glove and the Scribbler robot, most likely due to unidentified runtime errors in the VHand signal-interpretation code.
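    The file-based bridge between the glove application and the Scribbler robot described above could look roughly like the Python sketch below. It assumes the pyserial package for the Bluetooth serial link; the file path, serial port, baud rate, and signal-to-command mapping are hypothetical placeholders, not the project's actual values.

```python
# Illustrative sketch only: a file-polling bridge from the glove application to
# the Scribbler robot, assuming the pyserial package; file path, serial port,
# baud rate, and the command mapping are hypothetical placeholders.
import time

import serial  # pyserial: the robot's Bluetooth link appears as a serial port

SIGNAL_FILE = "hand_signals.txt"   # hypothetical file written by the glove app
PORT = "/dev/rfcomm0"              # hypothetical Bluetooth serial device
COMMANDS = {                       # hypothetical signal-to-command mapping
    "open_hand": b"STOP\n",
    "fist": b"FORWARD\n",
    "index_only": b"LEFT\n",
    "pinky_only": b"RIGHT\n",
}

def latest_signal(path):
    """Return the most recently written hand-signal label, if any."""
    try:
        with open(path) as f:
            lines = [ln.strip() for ln in f if ln.strip()]
        return lines[-1] if lines else None
    except FileNotFoundError:
        return None

def main():
    link = serial.Serial(PORT, baudrate=38400, timeout=1)
    last = None
    while True:
        sig = latest_signal(SIGNAL_FILE)
        if sig and sig != last and sig in COMMANDS:
            link.write(COMMANDS[sig])  # forward the mapped command to the robot
            last = sig
        time.sleep(0.05)               # file polling adds latency (see abstract)

if __name__ == "__main__":
    main()
```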

    Quantifying Coherence when Learning Behaviors via Teleoperation

    ©2008 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other users, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works for resale or redistribution to servers or lists, or reuse of any copyrighted components of this work in other works.
    Presented at the 17th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN 2008), Technische Universität München, Munich, Germany, August 1-3, 2008.
    DOI: 10.1109/ROMAN.2008.4600711
    Applications of robotics are changing quickly. Just as computer use evolved from research purposes to everyday functions, applications of robotics are making a transition to mainstream usage. With this change in applications comes a change in the user base of robotics, and there is a pronounced move to reduce the complexity of robotic control. This move is linked to the separation of the roles of robot designer and robot operator: for many target applications, the operator of the robot needs to be able to correct and augment its capabilities. One method to enable this is learning from human data, which has already been applied successfully to robotics. We assert that this learning process is only viable when the demonstrated human behavior is coherent. In this work we test the hypothesis that quantifying the coherence in the provided instruction can provide useful information about the progress of the learning process. We discuss results from the application of this method to reactive behaviors, which keep the learning process computationally tractable in real time. These results support the hypothesis that coherence is important for this type of learning and also show that this property can provide an avenue for self-regulation of the learning process.

    A 3D Virtual Environment for Exploratory Learning in Mobile Robot Control

    ©2005 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other users, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works for resale or redistribution to servers or lists, or reuse of any copyrighted components of this work in other works.
    Presented at the 2005 IEEE International Conference on Systems, Man and Cybernetics, Waikoloa, Hawaii, USA, October 10-12, 2005.
    DOI: 10.1109/ICSMC.2005.1571163
    This paper discusses a virtual environment that enables human agents to develop the skills necessary to control a mobile robot through exploratory learning practices. The interface connects the human user to both a virtual robot and a physical robot resident in the real world, and allows evaluation of human performance using a framework that analyzes execution parameters during operation. The execution data are then used to compare the capability of human agents to learn the skill sets necessary to control the robot in a novel task situation. We give an overview of the environment, as well as experimental results comparing the performance of multiple operators learning to control a virtual robot.

    Approximate Reasoning for Safety and Survivability of Planetary Rovers

    © 2003 Elsevier Science B.V.
    DOI: 10.1016/S0165-0114(02)00228-2
    Operational safety and health monitoring are critical matters for autonomous planetary rovers operating on remote and challenging terrain. This paper describes rover safety issues and presents an approximate reasoning approach to maintaining vehicle safety in a navigational context. The proposed rover safety module is composed of two distinct behaviors: safe attitude (pitch and roll) management and safe traction management. Fuzzy logic implementations of these behaviors on outdoor terrain are presented. Sensing of vehicle safety, coupled with visual neural-network-based perception of terrain quality, is used to infer safe speeds during rover traversal. In addition, approximate reasoning for self-regulation of internal operating conditions is briefly discussed. The core theoretical foundations of the applied soft computing techniques are presented and supported by descriptions of field tests and laboratory experimental results. For autonomous rovers, the approach provides intrinsic safety cognizance and a capacity for reactive mitigation of navigation risks.
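    A minimal sketch of the kind of approximate reasoning described here, inferring a safe speed fraction from attitude and a terrain-quality score, is given below in Python. The membership functions, rule outputs, and the 30-degree attitude limit are assumptions for illustration only, not the paper's rule base.

```python
# Illustrative sketch only (not the paper's rule base): a toy fuzzy inference
# of a safe rover speed fraction from attitude (pitch, roll) and a terrain-
# quality score in [0, 1] supplied by a perception module.

def unsafe_attitude(angle_deg, limit_deg=30.0):
    """Membership of the attitude angle in an 'unsafe' fuzzy set (toy limit)."""
    return min(1.0, max(0.0, abs(angle_deg) / limit_deg))

def safe_speed(pitch_deg, roll_deg, terrain_quality):
    """Return a speed command as a fraction of the rover's maximum speed."""
    # Rule firing strengths:
    #   IF attitude unsafe OR terrain poor THEN speed SLOW (0.1)
    #   IF attitude safe AND terrain good THEN speed FAST (1.0)
    risk = max(unsafe_attitude(pitch_deg), unsafe_attitude(roll_deg),
               1.0 - terrain_quality)
    calm = 1.0 - risk
    # Weighted average of the two singleton speed outputs.
    return risk * 0.1 + calm * 1.0

if __name__ == "__main__":
    print(f"speed fraction: {safe_speed(pitch_deg=12.0, roll_deg=5.0, terrain_quality=0.8):.2f}")
```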

    An intelligent terrain-based navigation system for planetary rovers

    ©2001 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other users, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works for resale or redistribution to servers or lists, or reuse of any copyrighted components of this work in other works.
    DOI: 10.1109/100.973242
    This paper presents an onboard terrain-based navigation system for mobile robots operating on natural terrain: a fuzzy logic framework for onboard terrain analysis and guidance toward traversable regions. The system analyzes the terrain on board and develops a set of fuzzy navigation rules that guide the rover toward the safest and most traversable regions. The overall navigation strategy deals with uncertain knowledge about the environment and uses the onboard terrain analysis to enable the rover to autonomously select easy-to-traverse paths to the goal. The navigation system is tested and validated in a set of physical rover experiments that demonstrate its autonomous capability.